# Specialized for information retrieval
## Stella PL Retrieval
A text encoder based on stella_en_1.5B_v5, further fine-tuned for Polish information retrieval tasks.
Text Embedding · Transformers · Other
sdadas
## MMLW Retrieval E5 Large
Apache-2.0
MMLW is a neural text encoder for Polish, optimized for information retrieval tasks and capable of converting queries and passages into 1024-dimensional vectors.
Text Embedding · Transformers · Other
sdadas
## MMLW Retrieval E5 Small
Apache-2.0
MMLW ("muszę mieć lepszą wiadomość", "I must have a better message") is a neural text encoder for Polish, optimized for information retrieval tasks and capable of converting queries and passages into 384-dimensional vectors.
Text Embedding · Transformers · Other
sdadas
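The retrieval workflow these encoders support is the same regardless of embedding size: embed a query and a set of passages, then rank passages by cosine similarity to the query. A minimal sketch of that ranking step, using random placeholder vectors in place of real model outputs (actual embeddings would come from one of the models above, e.g. via the sentence-transformers library, at their native 1024 or 384 dimensions):

```python
import numpy as np

def cosine_rank(query_vec, passage_vecs):
    """Rank passages by cosine similarity to the query vector.

    query_vec: shape (d,); passage_vecs: shape (n, d).
    Returns (indices of passages, best first; scores in that order).
    """
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_vecs / np.linalg.norm(passage_vecs, axis=1, keepdims=True)
    scores = p @ q                  # cosine similarity of each passage to the query
    order = np.argsort(-scores)     # descending by similarity
    return order, scores[order]

# Placeholder 4-dimensional vectors standing in for real embeddings.
rng = np.random.default_rng(0)
query = rng.normal(size=4)
passages = rng.normal(size=(3, 4))
order, scores = cosine_rank(query, passages)
print(order, scores)
```

Because the vectors are unit-normalized first, the dot product equals cosine similarity, so scores always fall in [-1, 1] and the first index in `order` is the best-matching passage.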
## Dense Encoder MSMARCO DistilBERT word2vec256k MLM 210k Emb Updated
A DistilBERT model with a word2vec-initialized 256k-entry vocabulary, optimized for sentence similarity and information retrieval tasks.
Text Embedding · Transformers
vocab-transformers
## Dense Encoder MSMARCO DistilBERT word2vec256k
A sentence encoder based on msmarco-word2vec256000-distilbert-base-uncased, using a word2vec-initialized 256k-entry vocabulary and designed for sentence similarity tasks.
Text Embedding · Transformers
vocab-transformers